WordReference Random House Unabridged Dictionary of American English © 2025
Mar′kov proc′ess, [Statistics.]
- *Statistics* a process in which future values of a random variable are statistically determined by present events and depend only on the event immediately preceding.
- after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it; the term is first recorded in English 1935–40
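The defining property above can be illustrated with a minimal sketch (not part of the dictionary entry): a two-state chain over hypothetical states `"sunny"` and `"rainy"`, where the next state is sampled using only the current state. The state names and transition probabilities are invented for illustration.

```python
import random

# Hypothetical transition probabilities: each row gives the distribution
# of the next state conditioned only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state given only the current one."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n, seed=0):
    """Run the chain for n steps; history beyond the current state is never consulted."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

Note that `step` receives only the current state, never the earlier ones, which is exactly the dependence "only on the event immediately preceding" that the definition describes.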